331 research outputs found

    Comment: Microarrays, Empirical Bayes and the Two-Groups Model

    Comment on ``Microarrays, Empirical Bayes and the Two-Groups Model'' [arXiv:0808.0572]. Published in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/07-STS236C.

    Comment: Microarrays, Empirical Bayes and the Two-Groups Model

    Comment on ``Microarrays, Empirical Bayes and the Two-Groups Model'' [arXiv:0808.0572]. Published in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/07-STS236A.

    Minimax and Adaptive Inference in Nonparametric Function Estimation

    Since Stein's 1956 seminal paper, shrinkage has played a fundamental role in both parametric and nonparametric inference. This article discusses minimaxity and adaptive minimaxity in nonparametric function estimation. Three interrelated problems are considered: function estimation under global integrated squared error, estimation under pointwise squared error, and nonparametric confidence intervals. Shrinkage is pivotal in the development of both the minimax theory and the adaptation theory. While the three problems are closely connected and the minimax theories bear some similarities, the adaptation theories are strikingly different. For example, in sharp contrast to adaptive point estimation, in many common settings there do not exist nonparametric confidence intervals that adapt to the unknown smoothness of the underlying function. A concise account of these theories is given, and the connections as well as differences among the problems are discussed and illustrated through examples. Published in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/11-STS355.
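    The role of shrinkage is easiest to see in the classical parametric setting. Below is a minimal sketch of the positive-part James-Stein estimator for a normal mean vector; the simulation setup and all names are illustrative assumptions, not material from the article, and the point is simply that shrinking toward zero lowers the total squared-error risk relative to the unshrunken estimate when the dimension is moderate or large.

    ```python
    import numpy as np

    def james_stein(y, sigma2=1.0):
        """Positive-part James-Stein: shrink the observation vector y toward 0."""
        p = y.size
        shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.sum(y ** 2))
        return shrink * y

    rng = np.random.default_rng(0)
    theta = rng.normal(scale=0.5, size=50)           # true mean vector
    risk_mle, risk_js = 0.0, 0.0
    for _ in range(2000):
        y = theta + rng.normal(size=theta.size)      # Y ~ N(theta, I)
        risk_mle += np.sum((y - theta) ** 2)
        risk_js += np.sum((james_stein(y) - theta) ** 2)
    print(risk_mle / 2000, risk_js / 2000)           # JS risk is typically smaller
    ```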

    Structured Matrix Completion with Applications to Genomic Data Integration

    Matrix completion has attracted significant recent attention in many fields, including statistics, applied mathematics and electrical engineering. The current literature on matrix completion focuses primarily on independent sampling models, under which the individual observed entries are sampled independently. Motivated by applications in genomic data integration, we propose a new framework of structured matrix completion (SMC) to treat structured missingness by design. Specifically, the proposed method aims at efficient matrix recovery when a subset of the rows and columns of an approximately low-rank matrix are observed. We provide theoretical justification for the proposed SMC method and derive a lower bound for the estimation errors, which together establish the optimal rate of recovery over certain classes of approximately low-rank matrices. Simulation studies show that the method performs well in finite samples under a variety of configurations. The method is applied to integrate several ovarian cancer genomic studies with different extents of genomic measurement, which enables us to construct more accurate prediction rules for ovarian cancer survival. Accepted for publication in the Journal of the American Statistical Association.
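    As a toy illustration of the structured-missingness setting (a subset of full rows and columns is observed), the sketch below recovers the missing block of an exactly low-rank matrix from the observed blocks via a pseudoinverse identity. This is only a sketch under an exact low-rank assumption; it is not the SMC estimator from the paper, and all names and dimensions are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Build an exactly rank-3 matrix A of size 40 x 30.
    U, V = rng.normal(size=(40, 3)), rng.normal(size=(3, 30))
    A = U @ V

    # Structured missingness: the first 20 rows and first 15 columns are fully observed,
    # so the blocks A11, A12, A21 are seen and the block A22 is missing.
    m1, n1 = 20, 15
    A11, A12, A21 = A[:m1, :n1], A[:m1, n1:], A[m1:, :n1]

    # If A has rank r and A11 already captures that rank, then A22 = A21 A11^+ A12.
    A22_hat = A21 @ np.linalg.pinv(A11) @ A12
    print(np.allclose(A22_hat, A[m1:, n1:]))   # True for an exactly low-rank A
    ```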

    Sharp RIP Bound for Sparse Signal and Low-Rank Matrix Recovery

    This paper establishes a sharp condition on the restricted isometry property (RIP) for both sparse signal recovery and low-rank matrix recovery. It is shown that if the measurement matrix $A$ satisfies the RIP condition $\delta_k^A < 1/3$, then all $k$-sparse signals $\beta$ can be recovered exactly via the constrained $\ell_1$ minimization based on $y = A\beta$. Similarly, if the linear map $\mathcal{M}$ satisfies the RIP condition $\delta_r^{\mathcal{M}} < 1/3$, then all matrices $X$ of rank at most $r$ can be recovered exactly via the constrained nuclear norm minimization based on $b = \mathcal{M}(X)$. Furthermore, in both cases it is not possible to do so in general when the condition does not hold. In addition, noisy cases are considered and oracle inequalities are given under the sharp RIP condition. To appear in Applied and Computational Harmonic Analysis (2012).
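    The constrained $\ell_1$ minimization referenced here (basis pursuit) can be solved as a linear program. Below is a minimal sketch using SciPy with a randomly generated Gaussian measurement matrix; the dimensions and sparsity level are illustrative only, and no claim is made that this particular matrix satisfies the stated RIP bound.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(2)
    n, p, k = 60, 200, 5                       # measurements, dimension, sparsity
    A = rng.normal(size=(n, p)) / np.sqrt(n)   # Gaussian measurement matrix
    beta = np.zeros(p)
    beta[rng.choice(p, k, replace=False)] = rng.normal(size=k)
    y = A @ beta

    # Basis pursuit: min ||b||_1 subject to A b = y, via the split b = u - v with u, v >= 0.
    c = np.ones(2 * p)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    beta_hat = res.x[:p] - res.x[p:]
    print(np.max(np.abs(beta_hat - beta)))     # near zero when recovery succeeds
    ```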

    Adaptive variance function estimation in heteroscedastic nonparametric regression

    We consider a wavelet thresholding approach to adaptive variance function estimation in heteroscedastic nonparametric regression. A data-driven estimator is constructed by applying wavelet thresholding to the squared first-order differences of the observations. We show that the variance function estimator is nearly optimally adaptive to the smoothness of both the mean and variance functions. The estimator is shown to achieve the optimal adaptive rate of convergence under the pointwise squared error simultaneously over a range of smoothness classes, and to be adaptively within a logarithmic factor of the minimax risk under the global mean integrated squared error over a collection of spatially inhomogeneous function classes. Numerical implementation and simulation results are also discussed. Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/07-AOS509.
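    The general recipe described here (square the first-order differences, then wavelet-threshold them) can be sketched in a few lines. The example below uses PyWavelets with a simple universal-style soft threshold; the wavelet, threshold choice, and test functions are illustrative assumptions, not the estimator or tuning analyzed in the paper.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(3)
    n = 1024
    x = np.linspace(0, 1, n)
    mean_f = np.sin(4 * np.pi * x)                  # smooth mean function
    sigma_f = 0.3 + 0.2 * np.cos(2 * np.pi * x)     # smooth standard-deviation function
    y = mean_f + sigma_f * rng.normal(size=n)

    # Squared first-order differences: when the mean is smooth,
    # E[(y_{i+1} - y_i)^2 / 2] is approximately sigma^2 at x_i,
    # so d is a noisy proxy for the variance function.
    d = (y[1:] - y[:-1]) ** 2 / 2.0

    # Wavelet soft thresholding of the proxy sequence.
    coeffs = pywt.wavedec(d, "db4", level=5)
    thr = np.median(np.abs(coeffs[-1])) / 0.6745 * np.sqrt(2 * np.log(d.size))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    var_hat = pywt.waverec(coeffs, "db4")[: d.size]
    print(np.mean((var_hat - sigma_f[:-1] ** 2) ** 2))  # rough MSE of the variance estimate
    ```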